Emotional Robots


Is it possible for machines to feel anger, fear or joy and recognize such emotions in people?

We can consider the well-known feeling of fear to be a product of evolution. It exists to protect our species from danger. When in fear, we either flee, look for a hiding place or disappear from the view of the people or objects that threaten us. Alternatively, we fight, summoning our strength to repel an attack and preparing our bodies for exertion. Fear can cause us to modify our behavior. It can drive us to change our life choices and grow more mature. Whether positive or negative, our emotions not only reflect changes in our environment but also drive us to create value systems. As some theories have it, our tendency to perceive the world in terms of good and evil is fueled by basic emotions such as fear, joy, sadness and anger.

Algorithms learn to look, listen and speak

Just as it is difficult for us to imagine a robot blowing its lid and throwing around plates in a fit of rage, it is hard to picture one that comes up to us to embrace and comfort us when we are feeling blue. It is equally unthinkable for a robot to choose to venture into a burning house to save our children. Isn’t it futile therefore to even wonder about machine emotions? Not entirely. With advances in AI technology, we are increasingly better at triggering mechanisms that make machines display behaviors previously considered to be the exclusive domain of humans and animals. Algorithms are bound to get better at communicating through language and recognizing shapes, colors, and voice intonations. This will inevitably broaden the range of machine responses to images, situations, and complex features and properties in the outside world. Technology is ever more adept at reading the environment because machines’ neural networks see the various properties and features of the world around us, such as sounds, colors, and smells, as data sets they can analyze, process, and learn from to draw conclusions and make decisions.
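To make that last point concrete, here is a minimal sketch in Python of what "seeing the world as data sets" means in practice: whether the input is a flattened face image or an audio feature vector, it reaches a neural network as a row of numbers. The data, emotion labels and network size below are invented purely for illustration.

```python
# A minimal sketch (hypothetical data and labels): to a neural network, an image,
# a sound clip, or a sensor reading is just a vector of numbers it can learn from.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Pretend each row is a flattened face image or an audio feature vector,
# and each label is an emotion category (0 = neutral, 1 = happy, 2 = angry).
X_train = rng.random((300, 64))          # 300 samples, 64 features each
y_train = rng.integers(0, 3, size=300)   # made-up emotion labels

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# A new, unseen observation is classified the same way: numbers in, decision out.
new_sample = rng.random((1, 64))
print(clf.predict(new_sample))           # e.g. [1], i.e. "happy"
```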

I listen to your heartbeat

Such machine reactions do not result from evolutionary processes, nor are they biochemically induced. For a robot to experience emotions that resemble human anger or joy, such emotions would have to be programmed by people or learned by neural networks through interactions with data. What we would be dealing with as a result wouldn't be emotions evoked on a computer disk but rather a set of prescribed or learned mechanical responses to specific stimuli. However, even if a robot is unable to spontaneously say: “I can see you are sad, can I help you?”, it could still recognize the signs of sadness in your facial expression and react to them in its limited way. While getting to this point would most likely take years of trials and require further advances in robotics, AI technology, mathematics, and statistics, it is not entirely beyond the bounds of possibility. According to Erik Brynjolfsson, professor at MIT Sloan, although people have an edge over machines in reading emotions, machines are gradually getting better at this task. They can recognize voice timbres and modulations and correlate them with stress and anger. They can also analyze images and capture the subtleties of facial microexpressions even better than humans. A case in point that shows the significant headway made in the field is wearable tech: the increasingly popular and powerful smartwatches and a whole array of workout gadgets. All these devices rely on algorithms that record, process and analyze heart rate and body temperature. Algorithms not only register such information but also use it to draw conclusions about us. How else, if not by drawing conclusions, could a gadget know to suggest that you run more, sleep longer or eat better?
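As a rough illustration of how such a gadget might turn raw readings into advice, here is a minimal sketch; the thresholds, readings and suggestions are assumptions made for this example, not any manufacturer's actual rules.

```python
# A purely illustrative sketch of the kind of rule a fitness gadget might apply.
# The thresholds and advice below are assumptions, not any vendor's actual logic.
from statistics import mean

def daily_suggestions(resting_heart_rates, sleep_hours, active_minutes):
    """Turn a week of raw readings into simple lifestyle suggestions."""
    suggestions = []
    if mean(resting_heart_rates) > 75:          # elevated resting pulse
        suggestions.append("Consider more aerobic exercise this week.")
    if mean(sleep_hours) < 7:                   # chronically short sleep
        suggestions.append("Try to sleep longer.")
    if mean(active_minutes) < 30:               # sedentary days
        suggestions.append("Run (or at least walk) more.")
    return suggestions or ["Keep doing what you're doing."]

print(daily_suggestions(
    resting_heart_rates=[78, 80, 76, 79, 77, 81, 75],
    sleep_hours=[6.0, 6.5, 5.5, 7.0, 6.0, 6.5, 6.0],
    active_minutes=[20, 15, 25, 10, 30, 20, 15],
))
```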

A monitoring app will relieve your stress

This was precisely the approach taken by MIT Media Lab researchers in developing a device designed to monitor a person’s heartbeat to measure levels of stress, pain, and frustration. Its most interesting part was a monitor, connected to an app, that would trigger the release of a fragrance to help users cope with the negative emotions they were experiencing. The machine thus followed a pattern of empathetic behavior. Signals from the human body, app responses to such signals, and people’s conscious behaviors have been correlated in further efforts to develop technologies that recognize various displays of emotion.
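A minimal sketch of that feedback loop might look like the following; the heart-rate threshold, window size and release_fragrance() hook are assumptions made for illustration, not the MIT device's actual design.

```python
# Sketch of a monitor-and-respond loop: watch the heartbeat, estimate stress,
# and trigger a calming response when a threshold is crossed (all values hypothetical).
from collections import deque

class StressMonitor:
    def __init__(self, window=10, stress_bpm=95):
        self.readings = deque(maxlen=window)   # rolling window of heart-rate samples
        self.stress_bpm = stress_bpm

    def release_fragrance(self):
        # Placeholder for the actuator the app would drive.
        print("Releasing calming fragrance...")

    def on_heart_rate(self, bpm):
        self.readings.append(bpm)
        if len(self.readings) == self.readings.maxlen:
            avg = sum(self.readings) / len(self.readings)
            if avg > self.stress_bpm:          # sustained elevated heart rate
                self.release_fragrance()
                self.readings.clear()          # avoid re-triggering immediately

monitor = StressMonitor()
for bpm in [72, 75, 88, 96, 101, 99, 104, 98, 102, 100, 97]:
    monitor.on_heart_rate(bpm)
```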

The difficult art of understanding jokes

For the time being, researchers still face many hurdles. One of them is the problem of context recognition, familiar to anyone involved in developing voice assistants and bots. While the assistants can easily tell you today’s weather, the meaning of “Should I take my umbrella with me today?” or any similar question is totally lost on them. Machines struggle to read nuanced meanings buried between the lines of such statements. They are equally stumped by jokes or any other statements with surprising punchlines. Such difficulties are a key challenge in coding responses to emotions. How do you teach a machine to distinguish a grimace of suffering from an exaggerated angry face you make in jest in response to your dog’s latest prank? What would a robot make of your widened eyes or of seeing you place your hands on your head? Would the machine take it to signify positive excitement or fear? I think we still have a long way to go before we can program machines to read such gestures accurately. Yet, even today it is clear that such accuracy is achievable and that success is only a matter of time and of building on what machines can already do.
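To make the context-recognition problem above concrete, here is a toy sketch in Python: a simple keyword-based intent matcher answers the direct weather question but misses the weather question implied by the umbrella remark. The intents and keyword lists are invented for illustration and are far cruder than what real assistants use.

```python
# A toy intent matcher: it handles literal keywords but not implied meaning.
INTENT_KEYWORDS = {
    "weather": ["weather", "temperature", "forecast", "rain"],
    "alarm": ["alarm", "wake me"],
}

def detect_intent(utterance):
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "unknown"

print(detect_intent("What's the weather today?"))                 # -> "weather"
print(detect_intent("Should I take my umbrella with me today?"))  # -> "unknown"
```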

Learn to like robots

The Japanese see robots differently than we do in Europe. While we are terrified by visions of a dehumanized world and anxious about machines stealing our office jobs, people in Japan are much more relaxed about, and even affectionate towards, robots. This may well be due, at least in part, to a belief system that allows inanimate objects to have souls. Although robots are not living creatures, you can nevertheless feel emotions about them – you can like them or even feel empathy towards them. This goes to show that man-machine relationships can cover a very broad spectrum of behaviors and emotions. Hence, our attitudes may be an additional key factor to consider in developing machine emotions.

Today, the boundaries between man and machine, and between the world of human emotions and algorithmic responses, seem perfectly clear. However, as artificial intelligence continues to develop, our views on these matters may also evolve. We should therefore not rule out the possibility that, sooner or later, someone will cry out at the sight of a machine: “Look, it smiled!”

.    .   .

Works cited:

Kara Swisher, Zuckerberg: The Recode interview. Everything was on the table — and after Facebook’s wildest year yet, that’s a really big table, Recode, Link, 2020.

Think Automation, A lighter side to AI: positive artificial intelligence quotes, Link, 2018.

Erik Brynjolfsson, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, Google Scholar, Link, 2020.

.    .   .

Related articles:

Algorithms born of our prejudices

How to regulate artificial intelligence?

Artificial Intelligence is an efficient banker

Will algorithms commit war crimes?

Machine, when will you learn to make love to me?

Artificial Intelligence is a new electricity

Norbert Biedrzycki Head of Services CEE at Microsoft. Leads Microsoft services in 36 countries, covering business and technology consulting, in particular in areas such as big data and AI, business applications, cybersecurity, and premium and cloud services. Previously, as Vice President of Digital McKinsey responsible for CEE, he delivered a holistic combination of strategic consulting and digital transformation through rapid deployment of business applications, big data solutions and advanced analytics, business use of artificial intelligence, blockchain and IoT. Prior to that, Norbert was the President of the Management Board and CEO of Atos Polska, and was also the CEO of ABC Data S.A. and the President of the Management Board and CEO of Sygnity S.A. He had previously also worked for McKinsey as a partner and, at the beginning of his career, headed Oracle's consulting and business development services. Norbert's passion is technology – he is interested in robotization, automation, Artificial Intelligence, blockchain, VR, AR, and IoT, and in the impact modern technologies have on our economy and society. You can read more on this on his blog.
